Learning to Importance Sample in Primary Sample Space
Authors
Abstract
Similar resources
Learning When to Reject an Importance Sample
When observations are incomplete or data are missing, approximate inference methods based on importance sampling are often used. Unfortunately, when the target and proposal distributions are dissimilar, the sampling procedure leads to biased estimates or requires a prohibitive number of samples. Our method approximates a multivariate target distribution by sampling from an existing, sequential ...
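The failure mode described here can be made concrete with a small sketch. The snippet below (NumPy, a made-up one-dimensional example, not this paper's method) runs self-normalized importance sampling against a fixed target with one well-matched and one badly mismatched proposal; the effective sample size shows how quickly a dissimilar proposal wastes samples and distorts the estimate.

    # Illustration only: self-normalized importance sampling with matched vs.
    # mismatched proposals; the target N(4, 1) is a made-up example.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 10_000

    def log_normal_pdf(x, mu, sigma):
        return -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma * np.sqrt(2 * np.pi))

    for prop_mu in (3.5, 0.0):                # similar vs. dissimilar proposal mean
        x = rng.normal(prop_mu, 1.0, size=n)  # draw from the proposal
        log_w = log_normal_pdf(x, 4.0, 1.0) - log_normal_pdf(x, prop_mu, 1.0)
        w = np.exp(log_w - log_w.max())
        w /= w.sum()                          # self-normalized importance weights
        ess = 1.0 / np.sum(w ** 2)            # effective sample size
        est = np.sum(w * x)                   # estimate of E[x] under the target
        print(f"proposal mean {prop_mu}: ESS = {ess:.1f} of {n}, E[x] ~ {est:.3f}")

With the mismatched proposal the effective sample size collapses to a handful of points, which is the "prohibitive number of samples" problem the abstract refers to.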
Learning Where to Sample in Structured Prediction
In structured prediction, most inference algorithms allocate a homogeneous amount of computation to all parts of the output, which can be wasteful when different parts vary widely in terms of difficulty. In this paper, we propose a heterogeneous approach that dynamically allocates computation to the different parts. Given a pre-trained model, we tune its inference algorithm (a sampler) to incre...
Learning mechanisms in matching to sample.
A model system and an experiment on early learning and decision processes in matching-to-sample and oddity-from-sample tasks are presented. The model system is based, in part, on videotaped records of pigeons' looking responses before they chose 1 of 2 comparison stimuli. In order to see the wavelength stimuli recessed behind the pecking keys, the pigeons had to move in front of them. Although ...
The sample size required in importance sampling
The goal of importance sampling is to estimate the expected value of a given function with respect to a probability measure ν using a random sample of size n drawn from a different probability measure μ. If the two measures μ and ν are nearly singular with respect to each other, which is often the case in practice, the sample size required for accurate estimation is large. In this article it is...
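For reference, the estimator implied by this setup is the standard importance sampling average; the notation below is the textbook form and is not taken from the paper itself.

    % Standard importance-sampling estimator for E_nu[f] using draws from mu,
    % weighted by the density ratio (Radon-Nikodym derivative) w = dnu/dmu.
    \[
      \mathbb{E}_{\nu}[f]
        = \int f \,\mathrm{d}\nu
        = \int f \,\frac{\mathrm{d}\nu}{\mathrm{d}\mu}\,\mathrm{d}\mu
        \approx \frac{1}{n}\sum_{i=1}^{n} f(X_i)\, w(X_i),
      \qquad X_i \sim \mu,\; w = \frac{\mathrm{d}\nu}{\mathrm{d}\mu}.
    \]

When μ and ν are nearly singular, the weights w(X_i) have very high variance, which is why the sample size n needed for an accurate estimate becomes large.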
Sample Importance in Training Deep Neural Networks
The contribution of each sample during model training varies across training iterations and the model’s parameters. We define the concept of sample importance as the change in parameters induced by a sample. In this paper, we explored the sample importance in training deep neural networks using stochastic gradient descent. We found that “easy” samples – samples that are correctly and confidentl...
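As a rough illustration of this definition (my own assumptions, not the paper's exact protocol), the sketch below measures each sample's importance as the size of the one-step SGD parameter update it would induce on its own, using a toy PyTorch model with hypothetical data.

    # Illustration only: per-sample importance as the norm of the parameter
    # change one SGD step on that single sample would cause (lr * gradient).
    import torch

    torch.manual_seed(0)
    model = torch.nn.Linear(10, 2)           # toy model, stands in for a deep net
    loss_fn = torch.nn.CrossEntropyLoss()
    lr = 0.1

    x = torch.randn(8, 10)                   # hypothetical mini-batch
    y = torch.randint(0, 2, (8,))

    importances = []
    for xi, yi in zip(x, y):
        model.zero_grad()
        loss = loss_fn(model(xi.unsqueeze(0)), yi.unsqueeze(0))
        loss.backward()
        step = sum((lr * p.grad).norm() ** 2 for p in model.parameters()) ** 0.5
        importances.append(step.item())

    print(importances)  # larger values = samples that move the parameters more

Samples that are already classified correctly and confidently produce small gradients and hence small importance values, which is consistent with the "easy samples" observation in the abstract.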
Journal
Journal title: Computer Graphics Forum
Year: 2019
ISSN: 0167-7055, 1467-8659
DOI: 10.1111/cgf.13628